Bilinear Generalized Approximate Message Passing - Part I: Derivation
Authors
Abstract
In this paper, we extend the generalized approximate message passing (G-AMP) approach, originally proposed for high-dimensional generalized-linear regression in the context of compressive sensing, to the generalized-bilinear case, which enables its application to matrix completion, robust PCA, dictionary learning, and related matrix-factorization problems. Here, in Part I of a two-part paper, we derive our Bilinear G-AMP (BiG-AMP) algorithm as an approximation of the sum-product belief propagation algorithm in the high-dimensional limit, where central-limit theorem arguments and Taylor-series approximations apply, and under the assumption of statistically independent matrix entries with known priors. In addition, we propose an adaptive damping mechanism that aids convergence under finite problem sizes, an expectation-maximization (EM)-based method to automatically tune the parameters of the assumed priors, and two rank-selection strategies. In Part II of the paper, we will discuss the specializations of EM-BiG-AMP to the problems of matrix completion, robust PCA, and dictionary learning, and we will present the results of an extensive empirical study comparing EM-BiG-AMP to state-of-the-art algorithms on each problem.
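For concreteness, the generalized-bilinear observation model that BiG-AMP targets can be sketched as follows (a summary in standard notation rather than a quotation from the paper; the dimension symbols M, N, L are illustrative):

\[
  \mathbf{Z} = \mathbf{A}\mathbf{X}, \qquad \mathbf{A}\in\mathbb{R}^{M\times N}, \quad \mathbf{X}\in\mathbb{R}^{N\times L},
\]
\[
  p(\mathbf{A}) = \prod_{m,n} p_{a_{mn}}\!\left(a_{mn}\right), \qquad
  p(\mathbf{X}) = \prod_{n,l} p_{x_{nl}}\!\left(x_{nl}\right),
\]
\[
  p(\mathbf{Y}\mid\mathbf{Z}) = \prod_{m,l} p_{y_{ml}\mid z_{ml}}\!\Big(y_{ml} \,\Big|\, \textstyle\sum_{n} a_{mn}x_{nl}\Big),
\]

i.e., both factors \(\mathbf{A}\) and \(\mathbf{X}\) are unknown, their entries are independent with known priors, and the observations \(\mathbf{Y}\) are generated entrywise through a possibly nonlinear likelihood. Choosing the likelihood and priors appropriately yields matrix completion, robust PCA, and dictionary learning as special cases, while the generalized-linear G-AMP setting corresponds to one of the two factors being known.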
Similar articles
Bilinear Generalized Approximate Message Passing - Part II: Applications
In this paper, we extend the generalized approximate message passing (G-AMP) approach, originally proposed for high-dimensional generalized-linear regression in the context of compressive sensing, to the generalized-bilinear case. In Part I of this two-part paper, we derived our Bilinear G-AMP (BiG-AMP) algorithm as an approximation of the sum-product belief propagation algorithm in the high-dim...
Bilinear Generalized Approximate Message Passing
We extend the generalized approximate message passing (G-AMP) approach, originally proposed for high-dimensional generalized-linear regression in the context of compressive sensing, to the generalized-bilinear case, which enables its application to matrix completion, robust PCA, dictionary learning, and related matrix-factorization problems. In the first part of the paper, we derive our Bilinear...
Hyperspectral image unmixing via bilinear generalized approximate message passing
In hyperspectral unmixing, the objective is to decompose an electromagnetic spectral dataset measured over M spectral bands and T pixels, into N constituent material spectra (or “endmembers”) with corresponding spatial abundances. In this paper, we propose a novel approach to hyperspectral unmixing (i.e., joint estimation of endmembers and abundances) based on loopy belief propagation. In parti...
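Although the snippet above is truncated, the linear mixing model that hyperspectral unmixing is usually posed with can be sketched as follows (the notation and the standard nonnegativity/sum-to-one abundance constraints are assumptions for illustration, not details quoted from the abstract):

\[
  \mathbf{Y} \approx \mathbf{S}\mathbf{A}, \qquad
  \mathbf{Y}\in\mathbb{R}^{M\times T}, \quad
  \mathbf{S}\in\mathbb{R}^{M\times N}, \quad
  \mathbf{A}\in\mathbb{R}^{N\times T},
\]

where column \(n\) of \(\mathbf{S}\) is the spectrum of the \(n\)-th endmember and column \(t\) of \(\mathbf{A}\) holds the abundances of the \(N\) materials at pixel \(t\) (typically nonnegative and summing to one). Joint estimation of \(\mathbf{S}\) and \(\mathbf{A}\) is thus a structured bilinear inference problem of the kind BiG-AMP addresses.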
Approximate Message Passing for Bilinear Models
Approach: We take a Bayesian approach to the inference problems (in particular, posterior estimation) that revolve around the bilinear model (1). In particular, we leverage the approximate message passing (AMP) framework of [2], [3] and extend it to the bilinear domain. Compared to Bayesian approaches that rely on Gibbs sampling methods or variational inference, the AMP framework allows us to f...
A Matching Pursuit Generalized Approximate Message Passing Algorithm
This paper proposes a novel matching pursuit generalized approximate message passing (MPGAMP) algorithm, which explores the support of the sparse representation coefficients step by step and estimates the mean and variance of the non-zero elements at each step using a generalized-approximate-message-passing-like scheme. In contrast to the classic message-passing-based algorithms and matching pursuit...
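The snippet above describes a greedy, support-growing strategy combined with per-step posterior mean and variance estimates. The sketch below is not the MPGAMP algorithm itself, only a minimal illustration of that general recipe under simplifying assumptions (Gaussian prior, additive white Gaussian noise, exact rather than GAMP-style posterior updates); all function and parameter names are hypothetical.

import numpy as np

def greedy_bayesian_pursuit(Phi, y, noise_var=1e-2, prior_var=1.0, k_max=10):
    """Illustrative greedy pursuit with per-step Gaussian posterior updates.

    Not the MPGAMP algorithm from the cited paper -- only a sketch of the
    general idea it describes: grow the support one index at a time and,
    at each step, compute the posterior mean and variance of the active
    coefficients under a Gaussian prior and Gaussian noise.
    """
    M, N = Phi.shape
    support = []
    residual = y.copy()

    for _ in range(k_max):
        # Matching-pursuit-style selection: column most correlated with residual.
        corr = np.abs(Phi.T @ residual)
        corr[support] = -np.inf            # do not reselect active indices
        support.append(int(np.argmax(corr)))

        # Gaussian posterior over the active coefficients x_S given y:
        #   Sigma = (Phi_S^T Phi_S / noise_var + I / prior_var)^{-1}
        #   mu    = Sigma Phi_S^T y / noise_var
        Phi_S = Phi[:, support]
        Sigma = np.linalg.inv(Phi_S.T @ Phi_S / noise_var
                              + np.eye(len(support)) / prior_var)
        mu = Sigma @ Phi_S.T @ y / noise_var
        residual = y - Phi_S @ mu

    x_mean = np.zeros(N)
    x_var = np.zeros(N)
    x_mean[support] = mu
    x_var[support] = np.diag(Sigma)
    return x_mean, x_var, support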
Journal: IEEE Trans. Signal Processing
Volume: 62, Issue: -
Pages: -
Publication year: 2014